Multilayer Perceptrons May Learn Simple Rules Quickly

Author

  • R Urbanczik
Abstract

Zero-temperature Gibbs learning is considered for a fully connected committee machine with K hidden units. For large K, the scale of the learning curve strongly depends on the target rule. When learning a perceptron, the sample size P needed for optimal generalization scales so that N ≪ P ≪ KN, where N is the dimension of the input. This even holds for a noisy perceptron rule if a new input is classified by the majority vote of all students in the version space. When learning a committee machine with M hidden units, where 1 ≪ M ≪ K, optimal generalization requires √(MK)·N ≪ P.
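For reference, the two scaling regimes claimed in the abstract can be written compactly in standard asymptotic notation (my restatement, using ≪ for "much smaller than"):

```latex
% Perceptron target rule (also with noise, using the version-space majority vote):
N \ll P \ll K N
% Committee-machine target with M hidden units, where 1 \ll M \ll K:
\sqrt{MK}\, N \ll P
```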


Similar articles

Feature Selection Using a Multilayer Perceptron

The problem of selecting the best set of features for target recognition using a multilayer perceptron is addressed in this paper. A technique has been developed which analyzes the weights in a multilayer perceptron to determine which features the network finds important and which are unimportant. A brief introduction to the use of multilayer perceptrons for classification and the training rule...
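The abstract does not spell out the weight-analysis criterion, but one common heuristic of this kind ranks each input feature by the total magnitude of its first-layer weights. A minimal sketch, assuming that heuristic (the function name and toy weights are illustrative, not from the paper):

```python
import numpy as np

def feature_saliency(W_in):
    """Rank input features by the total magnitude of their first-layer
    weights (one common weight-analysis heuristic; the paper's exact
    criterion is not given in the abstract).

    W_in: array of shape (n_hidden, n_features), input-to-hidden weights.
    Returns one saliency score per input feature."""
    return np.abs(W_in).sum(axis=0)

# Toy weights: feature 1 carries large weights, feature 0 near-zero ones.
W = np.array([[0.01,  2.0, 0.5],
              [0.02, -1.5, 0.3]])
scores = feature_saliency(W)
print(scores)                 # feature 1 dominates
print(int(scores.argmax()))   # -> 1
```

Features with near-zero saliency are candidates for removal before retraining.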


Ensemble of Linear Experts as an Interpretable Piecewise-linear Classifier

In this study we propose a new ensemble model composed of several linear perceptrons. The objective is to build a piecewise-linear classifier that is not only competitive with Multilayer Perceptrons (MLPs) in generalization performance but also interpretable in the form of human-comprehensible rules. We present a simple competitive training method that allows the ensemble to effective...
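To illustrate why an ensemble of linear perceptrons can be piecewise-linear yet interpretable, here is a hand-built two-expert classifier (a sketch of the general idea only; the paper's competitive training method and rule extraction are not reproduced). The target concept, label +1 iff |x| > 1, cannot be expressed by a single linear unit on (x, bias):

```python
import numpy as np

# Two linear experts over the augmented input (x, 1).
experts = np.array([[ 1.0, -1.0],   # responds for x > 1
                    [-1.0, -1.0]])  # responds for x < -1

def predict(x):
    acts = experts @ np.array([x, 1.0])
    return np.sign(acts[acts.argmax()])   # the most active expert decides

for x in (-2.0, -0.5, 0.5, 2.0):
    print(x, predict(x))   # -> +1, -1, -1, +1
```

Each expert is a readable linear rule ("x > 1 ⇒ +1", "x < -1 ⇒ +1"), and their competition yields the piecewise-linear decision boundary.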


Empirical Determination of Sample Sizes for Multi-layer Perceptrons by Simple RBF Networks

It is well known that training multilayer perceptrons takes a very long time, because of the size of the networks' weight space and the small weight adjustments made at each step toward convergence. The matter becomes worse when the training data set is large, which is common in data mining tasks. Moreover, the performance of neural networks changes depending on the samples. So, in order to det...


Local linear perceptrons for classification

A structure composed of local linear perceptrons for approximating global class discriminants is investigated. Such local linear models may be combined in a cooperative or competitive way. In the cooperative model, a weighted sum of the outputs of the local perceptrons is computed where the weight is a function of the distance between the input and the position of the local perceptron. In the c...
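The cooperative combination can be sketched as follows, assuming a Gaussian distance kernel for the weighting (the kernel choice, centers, and parameters here are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Each local perceptron has a position c_i; its linear output is blended
# with a weight that decays with the distance from the input to c_i.
centers = np.array([[-1.0], [1.0]])        # positions of the local experts
W = np.array([[-1.0, 0.0], [1.0, 0.0]])    # (slope, bias) of each expert

def cooperative_output(x, sigma=0.5):
    d2 = ((x - centers) ** 2).sum(axis=1)  # squared distance to each center
    g = np.exp(-d2 / (2 * sigma ** 2))     # distance-based weights
    g = g / g.sum()                        # normalize to a soft mixture
    local = W[:, 0] * x + W[:, 1]          # each expert's linear output
    return float(g @ local)

print(cooperative_output(-1.0))  # dominated by the left expert, ~ +1
print(cooperative_output(1.0))   # dominated by the right expert, ~ +1
```

Near each center the nearby expert dominates the weighted sum, so the global discriminant is approximated patchwise by the local linear models.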


Efficient estimation of multidimensional regression model using multilayer perceptrons

This work concerns the estimation of multidimensional nonlinear regression models using multilayer perceptrons (MLPs). The main problem with such models is that we need to know the covariance matrix of the noise to get an optimal estimator. However, we show in this paper that if we choose as the cost function the logarithm of the determinant of the empirical error covariance matrix, then we get...
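The cost function described above is straightforward to compute: take the residual matrix, form its empirical covariance, and return the log-determinant. A minimal sketch (with a scaled copy of the targets standing in for an MLP's predictions):

```python
import numpy as np

def logdet_cost(Y, Y_hat):
    """Y, Y_hat: arrays of shape (n_samples, n_outputs).
    Returns log det of the empirical error covariance matrix, which
    can be minimized without knowing the noise covariance."""
    E = Y - Y_hat
    S = E.T @ E / len(E)                   # empirical error covariance
    sign, logdet = np.linalg.slogdet(S)    # numerically stable log det
    return logdet

rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 2))
# A better fit (smaller residuals) yields a smaller cost:
print(logdet_cost(Y, 0.9 * Y) < logdet_cost(Y, 0.5 * Y))  # -> True
```

Using `slogdet` rather than `log(det(...))` avoids overflow/underflow when the residuals are very small or the output dimension is large.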




Journal title:

Volume   Issue

Pages  -

Publication date: 1997